Function, Gradient, and Hessian Recovery Using Quadratic Edge-Bump Functions

Author

  • Jeffrey S. Ovall
Abstract

An approximate error function for the discretization error on a given mesh is obtained by projecting (via the energy inner product) the functional residual onto the space of continuous, piecewise quadratic functions that vanish at the vertices of the mesh. Conditions are given under which one can expect this hierarchical basis error estimator to give efficient and reliable function recovery, asymptotically exact gradient recovery, and convergent Hessian recovery in the $L^2$ norms. Similar function recovery results do not appear elsewhere in the literature. The analysis given here is based on a certain superconvergence result which has been used elsewhere in the analysis of gradient recovery methods. Numerical experiments are provided which demonstrate the effectivity of the approximate error function in practice.
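As a worked sketch of the construction the abstract describes (the notation below is assumed, not taken from the paper): let $u_h \in V_h$ be the finite element solution satisfying $a(u_h, v) = F(v)$ for all $v \in V_h$, and let $W_h$ be the space of continuous piecewise quadratics vanishing at the mesh vertices, spanned by the quadratic edge-bump functions. Projecting the functional residual onto $W_h$ in the energy inner product means finding $\varepsilon_h \in W_h$ such that

$$a(\varepsilon_h, w) = F(w) - a(u_h, w) \qquad \text{for all } w \in W_h.$$

The recovered function, gradient, and Hessian are then read off from $u_h + \varepsilon_h$ and its derivatives.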


Similar Articles

Edge Detection with Hessian Matrix Property Based on Wavelet Transform

In this paper, we present an edge detection method based on the wavelet transform and the Hessian matrix of the image at each pixel. Many wavelet-based methods use the wavelet transform to approximate the gradient of the image and detect edges by searching for the modulus maxima of the gradient vectors. In our scheme, we also use the wavelet transform to approximate the Hessian matrix of the image at each pixel. ...
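A minimal sketch of the Hessian-based part of such a scheme, using Gaussian derivative filters in place of the paper's wavelet transform (the filter choice, function name, and `sigma` parameter are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_edge_response(image, sigma=2.0):
    """Per-pixel Hessian eigenvalues of a smoothed image.

    Gaussian derivative filters stand in for the paper's
    wavelet-based Hessian approximation (an assumption made
    here for brevity).
    """
    Ixx = gaussian_filter(image, sigma, order=(2, 0))
    Iyy = gaussian_filter(image, sigma, order=(0, 2))
    Ixy = gaussian_filter(image, sigma, order=(1, 1))
    # Closed-form eigenvalues of the symmetric 2x2 Hessian.
    trace = Ixx + Iyy
    disc = np.sqrt((Ixx - Iyy) ** 2 + 4.0 * Ixy ** 2)
    lam1 = 0.5 * (trace + disc)  # larger eigenvalue
    lam2 = 0.5 * (trace - disc)  # smaller eigenvalue
    return lam1, lam2
```

Pixels where an eigenvalue has large magnitude indicate strong second-order intensity variation, the usual cue for edge and ridge candidates in Hessian-based detectors.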


Mode-Finding for Mixtures of Gaussian Distributions

I consider the problem of finding all the modes of a mixture of multivariate Gaussian distributions, which has applications in clustering and regression. I derive exact formulas for the gradient and Hessian and give a partial proof that the number of modes cannot exceed the number of components and that the modes are contained in the convex hull of the component centroids. Then, I develop two exhaustiv...
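For reference, the exact gradient and Hessian the abstract mentions take the following standard form for a Gaussian mixture density (standard notation; the paper's own notation may differ):

$$p(x) = \sum_{m=1}^{M} \pi_m\, \mathcal{N}(x;\mu_m,\Sigma_m), \qquad \nabla p(x) = -\sum_{m=1}^{M} \pi_m\, \mathcal{N}(x;\mu_m,\Sigma_m)\, \Sigma_m^{-1}(x-\mu_m),$$

$$\nabla^2 p(x) = \sum_{m=1}^{M} \pi_m\, \mathcal{N}(x;\mu_m,\Sigma_m)\, \left[\Sigma_m^{-1}(x-\mu_m)(x-\mu_m)^{\top}\Sigma_m^{-1} - \Sigma_m^{-1}\right].$$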


Convergence Rate of an Optimization Algorithm for Minimizing Quadratic Functions with Separable Convex Constraints

A new active-set algorithm for minimizing quadratic functions with separable convex constraints is proposed by combining the conjugate gradient method with the projected gradient. It generalizes recently developed algorithms for quadratic programming with simple bound constraints. A linear convergence rate in terms of the spectral condition number of the Hessian is proven. Numerical experiments, includi...
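The sketch below shows only the projected-gradient building block for a bound-constrained quadratic, not the paper's full active-set conjugate-gradient algorithm; the function name and step-size rule are assumptions for illustration:

```python
import numpy as np

def projected_gradient_qp(A, b, lo, hi, x0, steps=500):
    """Minimize 0.5*x^T A x - b^T x subject to lo <= x <= hi.

    Plain projected gradient descent -- only the projection
    building block of the paper's method, not its active-set
    conjugate-gradient scheme.
    """
    # A fixed step of 1/lambda_max guarantees descent for a
    # convex quadratic (A symmetric positive definite assumed).
    step = 1.0 / np.linalg.eigvalsh(A)[-1]
    x = np.clip(x0, lo, hi)
    for _ in range(steps):
        grad = A @ x - b
        x = np.clip(x - step * grad, lo, hi)  # step, then project
    return x
```

For simple bounds the projection is just a componentwise clip, which is what makes this constraint class cheap to handle inside more elaborate active-set methods.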


Continuous Discrete Variable Optimization of Structures Using Approximation Methods

Optimum design of structures is carried out with both continuous and discrete design variables. To reduce the computational work involved in the optimization process, all the functions that are expensive to evaluate are approximated. To approximate these functions, a semi-quadratic function is employed. Only the diagonal terms of the Hessian matrix are used, and these elements are estimated fr...
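The separable ("semi-quadratic") approximation the abstract describes has the following general form (the symbol $h_i$ for the estimated diagonal Hessian terms is an assumption, not the paper's notation):

$$\tilde f(x) = f(x_0) + \sum_i \frac{\partial f}{\partial x_i}(x_0)\,(x_i - x_{0,i}) + \frac{1}{2}\sum_i h_i\,(x_i - x_{0,i})^2,$$

i.e. a second-order Taylor model with the off-diagonal Hessian terms dropped, so that each design variable can be treated independently in the approximate subproblem.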


Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods

We present an algorithm for minimizing a sum of functions that combines the computational efficiency of stochastic gradient descent (SGD) with the second-order curvature information leveraged by quasi-Newton methods. We unify these disparate approaches by maintaining an independent Hessian approximation for each contributing function in the sum. We maintain computational tractability and limit ...
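A toy sketch of the core idea (one quadratic model per summand; the summed models are minimized exactly, then one model is refreshed per step), using diagonal secant Hessians for brevity. The paper's algorithm is considerably more elaborate, and all names, parameters, and update rules below are illustrative assumptions:

```python
import numpy as np

def sum_of_models_minimize(grads, x0, passes=20, h0=1.0):
    """Minimize sum_i f_i(x) via one quadratic model per summand.

    Each f_i keeps its own anchor x_i, gradient g_i, and diagonal
    Hessian approximation h_i. Every step minimizes the sum of all
    models, then refreshes one model with a diagonal secant update.
    Illustrative only -- not the paper's full algorithm.
    """
    N, n = len(grads), len(x0)
    xs = np.tile(np.asarray(x0, float), (N, 1))        # model anchors
    gs = np.array([g(xs[i]) for i, g in enumerate(grads)])
    hs = np.full((N, n), h0)                           # diagonal Hessians
    x = np.asarray(x0, float).copy()
    for _ in range(passes):
        for i in range(N):
            # Stationarity of the summed models (elementwise):
            #   sum_j [g_j + h_j * (x - x_j)] = 0
            x = (hs * xs - gs).sum(axis=0) / hs.sum(axis=0)
            g_new = grads[i](x)
            dx, dg = x - xs[i], g_new - gs[i]
            ok = np.abs(dx) > 1e-12
            # Secant curvature estimate, clipped to stay positive.
            hs[i, ok] = np.clip(dg[ok] / dx[ok], 1e-3, 1e3)
            xs[i], gs[i] = x, g_new
    return x
```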



Journal:
  • SIAM J. Numerical Analysis

Volume 45  Issue —

Pages —

Publication date: 2007